# Multilingual Mask Filling

## Multilingual ModernBert Large Preview
License: MIT
A large multilingual BERT model developed by the Algomatic team. It supports an 8192-token context length, was trained on approximately 60 billion tokens, and is suited to mask-filling tasks.
Tags: Large Language Model
Author: makiart
## Multilingual ModernBert Base Preview
License: MIT
A multilingual BERT model developed by the Algomatic team. It supports mask-filling tasks with an 8192-token context length and a vocabulary of 151,680 tokens.
Tags: Large Language Model, Safetensors
Author: makiart
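Both models above target the mask-filling (cloze) task: given a sentence with one position blanked out, the model ranks candidate tokens for that slot. As a minimal self-contained sketch of the task itself (a toy count-based scorer, not the actual makiart checkpoints or the Hugging Face API):

```python
from collections import Counter, defaultdict

MASK = "[MASK]"

def build_context_counts(corpus):
    """Count how often each word appears between a given (left, right) word pair."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for i in range(1, len(words) - 1):
            counts[(words[i - 1], words[i + 1])][words[i]] += 1
    return counts

def fill_mask(sentence, counts, top_k=3):
    """Rank candidate words for the single [MASK] slot by how often
    they occurred in the same (left, right) context in the corpus."""
    words = sentence.split()
    i = words.index(MASK)
    context = (words[i - 1], words[i + 1])
    return [w for w, _ in counts[context].most_common(top_k)]

# Tiny hypothetical corpus standing in for real training data.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat lay on the mat",
]
counts = build_context_counts(corpus)
print(fill_mask("the cat [MASK] on the mat", counts))  # → ['sat', 'lay']
```

A real BERT-style model does the same ranking with a softmax over its full vocabulary (151,680 tokens for the Base Preview model), conditioned on the entire context window rather than just the two neighboring words.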
© 2025 AIbase